Margins, Shrinkage, and Boosting
Author
Abstract
This manuscript shows that AdaBoost and its immediate variants can produce approximate maximum-margin classifiers simply by scaling their step sizes by a fixed small constant. In this way, when the unscaled step size is an optimal choice, these results provide guarantees for Friedman’s empirically successful “shrinkage” procedure for gradient boosting (Friedman, 2000). Guarantees are also provided for a variety of other step sizes, affirming the intuition that increasingly regularized line searches yield improved margin guarantees. The results hold for the exponential loss and similar losses, most notably the logistic loss.
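The shrinkage procedure the abstract refers to can be illustrated in a few lines: run AdaBoost as usual, but multiply each round's step size by a fixed small constant. Below is a minimal sketch assuming decision stumps as weak learners; the function names, the shrinkage parameter `nu`, and the weight-update details are illustrative choices, not taken from the paper.

```python
import numpy as np

def adaboost_shrinkage(X, y, rounds=50, nu=0.1):
    """AdaBoost on decision stumps, with each step size scaled by nu in (0, 1].

    nu = 1.0 recovers plain AdaBoost; small nu is Friedman-style shrinkage.
    Labels y are assumed to be +1/-1.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)            # example weights
    stumps = []                         # (feature, threshold, polarity, alpha)
    for _ in range(rounds):
        # Exhaustively pick the stump minimizing weighted training error.
        best = None
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] >= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol)
        err, j, thr, pol = best
        err = np.clip(err, 1e-12, 1 - 1e-12)
        # Shrunken step: the usual exponential-loss line-search step, times nu.
        alpha = nu * 0.5 * np.log((1 - err) / err)
        pred = pol * np.where(X[:, j] >= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)  # exponential-loss reweighting
        w /= w.sum()
        stumps.append((j, thr, pol, alpha))
    return stumps

def predict(stumps, X):
    """Sign of the weighted vote of all learned stumps."""
    score = sum(a * p * np.where(X[:, j] >= t, 1, -1) for j, t, p, a in stumps)
    return np.sign(score)
```

Because each step is a fixed fraction of the unscaled optimal step, the ensemble's weight vector grows slowly, which is the mechanism behind the improved margin guarantees the paper analyzes.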
Similar articles
Microleakage assessment of one- and two-step self-etch adhesive systems with the low shrinkage composites
BACKGROUND AND AIM: Different studies evaluating one-step self-etch (SE) adhesive systems show contradictory findings, so the aim of this study was to compare the microleakage of one-step SE adhesive systems and CLEARFIL SE BOND (which serves as the “gold-standard” SE adhesive) with low-shrinkage composites. METHODS: In this in vitro study, Class V cavities with the occlusal margin in enamel and...
Analyzing Margins in Boosting
While the success of boosting or voting methods has been evident from experimental data [11], questions about why boosting does not overfit on training data remain. One idea about the effectiveness of boosting was given by Schapire et al. in which they presented an explanation of why the test error of a boosting classifier does not increase with its size by looking at margins. They showed that ...
The tissue shrinkage phenomenon on surgical margins in oral and oropharyngeal squamous cell carcinoma
Aim: One of the most important factors associated with recurrence rate and overall survival is the status of the surgical margin of resection, free of disease. However, the margins measured intra-operatively at the time of surgery sometimes differ from those measured by the pathologist in the histopathologic analysis. Faced with this dilemma, a literature review of the best available evidence was con...
A more robust boosting algorithm
We present a new boosting algorithm, motivated by the large-margins theory for boosting. We give experimental evidence that the new algorithm is significantly more robust against label noise than existing boosting algorithms.
Scaling Boosting by Margin-Based Inclusion of Features and Relations
Boosting is well known to increase the accuracy of propositional and multi-relational classification learners. However, the base learner’s efficiency vitally determines boosting’s efficiency since the complexity of the underlying learner is amplified by iterated calls of the learner in the boosting framework. The idea of restricting the learner to smaller feature subsets in order to increase ef...